[codex] fix(gui): respect model reasoning disablement #12021
yzlu0917 wants to merge 1 commit into continuedev:main from
Conversation
💡 Codex Review: continue/core/llm/llms/Ollama.ts, line 514 in 85f7d1f
Summary
- Prevent `streamNormalInput` from overriding chat models that explicitly set `completionOptions.reasoning = false`
- Add a regression test covering the `reasoning` flag for an Ollama model with reasoning disabled

Why
`streamNormalInput` always wrote `completionOptions.reasoning` from the session toggle whenever `hasReasoningEnabled` was set. That worked for most providers, but it broke Ollama chat models that explicitly disable reasoning in config, because the GUI would still send `reasoning: true` and Ollama would reject the request with errors like "does not support thinking".

Validation
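The guard described above can be sketched as follows. This is a minimal illustration of the behavior the PR describes, not the actual Continue source: the `resolveReasoning` helper and its simplified types are assumptions made for the example.

```typescript
// Simplified stand-ins for the real config types (assumed shapes).
interface CompletionOptions {
  reasoning?: boolean;
}

interface ChatModel {
  completionOptions?: CompletionOptions;
}

// Apply the session-level reasoning toggle only when the model does not
// explicitly disable reasoning in its config. A model that sets
// completionOptions.reasoning = false keeps it false regardless of the toggle.
function resolveReasoning(
  model: ChatModel,
  hasReasoningEnabled: boolean | undefined,
): CompletionOptions {
  const options: CompletionOptions = { ...model.completionOptions };
  if (
    hasReasoningEnabled !== undefined &&
    model.completionOptions?.reasoning !== false
  ) {
    options.reasoning = hasReasoningEnabled;
  }
  return options;
}

// A disabled model is left alone; other models follow the toggle.
console.log(resolveReasoning({ completionOptions: { reasoning: false } }, true));
// { reasoning: false }
console.log(resolveReasoning({}, true));
// { reasoning: true }
```

With this shape, an Ollama model configured with `reasoning: false` never receives a forced `reasoning: true`, which avoids the "does not support thinking" rejection.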
- `./node_modules/.bin/vitest run src/redux/thunks/streamResponse.test.ts` in `gui`
- `git diff --check`
- `./node_modules/.bin/tsc -p ./ --noEmit` in `gui`, which only hit the existing unrelated `OPENROUTER_HEADERS` export mismatch from `../core/llm/llms/OpenRouter.ts`

Closes #11265
Summary by cubic
Respect model-level reasoning disablement in chat requests. Ollama models with `completionOptions.reasoning: false` no longer receive a forced `reasoning` flag; the session toggle still applies to models that allow overrides.

- Skip the session reasoning toggle when `completionOptions.reasoning === false` to avoid Ollama "does not support thinking" errors.
- Added a test covering `reasoning` for disabled Ollama models.

Written for commit 85f7d1f. Summary will update on new commits.